Improving the Performance of the PNLMS Algorithm Using l1 Norm Regularization

Authors

  • Rajib Lochan Das
  • Mrityunjoy Chakraborty
Abstract

The proportionate normalized least mean square (PNLMS) algorithm and its variants are by far the most popular adaptive filters used to identify sparse systems. The convergence speed of the PNLMS algorithm, though very high initially, slows down at a later stage, even becoming worse than that of sparsity-agnostic adaptive filters like the NLMS. In this paper, we address this problem by introducing a carefully constructed l1 norm (of the coefficients) penalty in the PNLMS cost function, which favors sparsity. This results in certain zero-attracting terms in the PNLMS weight update equation that help shrink the coefficients, especially the inactive taps, thereby arresting the slowdown in convergence and also producing lower steady-state excess mean square error (EMSE). A rigorous convergence analysis of the proposed algorithm is presented that expresses the steady-state mean square deviation of both the active and the inactive taps in terms of a zero-attracting coefficient of the algorithm. The analysis reveals that further reduction of the EMSE is possible by deploying a variable step size (VSS) simultaneously with a variable zero-attracting coefficient in the weight update process. Simulation results confirm the superior performance of the proposed VSS zero-attracting PNLMS algorithm over existing algorithms, especially in achieving both higher convergence speed and lower steady-state EMSE simultaneously.
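The idea described in the abstract — a PNLMS update augmented with an l1-norm zero attractor — can be sketched as below. This is a minimal illustration, not the paper's exact algorithm: the step size `mu`, attractor weight `rho_za`, and PNLMS activation parameters `delta_p` and `rho_p` are illustrative choices, and the system to identify is a made-up sparse impulse response.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sparse system to identify (three active taps out of 64).
L = 64
w_true = np.zeros(L)
w_true[[5, 20, 40]] = [1.0, -0.5, 0.3]

mu = 0.5                     # step size (illustrative)
rho_za = 1e-4                # zero-attracting coefficient (l1 penalty weight)
delta_p, rho_p = 0.01, 0.01  # standard PNLMS activation parameters
eps = 1e-6                   # regularization of the normalization term

w = np.zeros(L)              # adaptive filter estimate
x_buf = np.zeros(L)          # input tap-delay line
for n in range(5000):
    x_buf = np.roll(x_buf, 1)
    x_buf[0] = rng.standard_normal()
    d = w_true @ x_buf + 1e-3 * rng.standard_normal()  # noisy desired signal
    e = d - w @ x_buf

    # PNLMS proportionate step-size gains: each tap's gain scales with its
    # current magnitude, floored so inactive taps keep adapting.
    gamma = np.maximum(rho_p * max(delta_p, np.max(np.abs(w))), np.abs(w))
    g = gamma / gamma.mean()

    # PNLMS update plus the zero attractor -rho_za * sign(w), which is the
    # gradient of the added l1-norm penalty and shrinks the inactive taps.
    w += mu * e * g * x_buf / (x_buf @ (g * x_buf) + eps) - rho_za * np.sign(w)

msd = np.sum((w - w_true) ** 2)  # steady-state mean square deviation
print(msd)
```

The `-rho_za * np.sign(w)` term is what distinguishes this sketch from plain PNLMS: it biases every nonzero coefficient toward zero, which mostly affects the small inactive taps and is what the abstract credits for the improved late-stage convergence.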


Similar articles

Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...
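The model-space iteratively reweighted least squares (IRLS) approach mentioned in this snippet can be sketched on a toy problem. This is an assumption-laden illustration, not the paper's large-scale method: the problem sizes, the regularization weight `lam`, and the smoothing constant `eps` are hypothetical, and the Golub-Kahan bidiagonalization step is omitted in favor of a direct solve.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy sparse model and linear forward operator (sizes are illustrative).
m_true = np.zeros(50)
m_true[[10, 30]] = [2.0, -1.5]
G = rng.standard_normal((80, 50))
d = G @ m_true + 0.01 * rng.standard_normal(80)

lam, eps = 0.1, 1e-6
m = np.linalg.lstsq(G, d, rcond=None)[0]  # plain least-squares start
for _ in range(20):
    # IRLS handles the non-smooth l1 stabilizer by replacing it with a
    # weighted l2 term: lam * sum(m_i^2 / (|m_i| + eps)) ~ lam * ||m||_1.
    W = np.diag(lam / (np.abs(m) + eps))
    m = np.linalg.solve(G.T @ G + W, G.T @ d)

err = np.linalg.norm(m - m_true)
print(err)
```

Each pass re-solves a weighted least-squares problem whose weights grow as a coefficient shrinks, which is what drives the sharp, sparse models the snippet describes; a large-scale implementation would project this solve onto a Golub-Kahan Krylov subspace instead of forming `G.T @ G` directly.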


An improved proportionate NLMS algorithm based on the l0 norm

The proportionate normalized least-mean-square (PNLMS) algorithm was developed in the context of network echo cancellation. It has been proven to be efficient when the echo path is sparse, which is not always the case in real-world echo cancellation. The improved PNLMS (IPNLMS) algorithm is less sensitive to the sparseness character of the echo path. This algorithm uses the l1 norm to exploit sp...


Zero Attracting PNLMS Algorithm and Its Convergence in Mean

The proportionate normalized least mean square (PNLMS) algorithm and its variants are by far the most popular adaptive filters used to identify sparse systems. The convergence speed of the PNLMS algorithm, though very high initially, slows down at a later stage, even becoming worse than that of sparsity-agnostic adaptive filters like the NLMS. In this paper, we address this problem by...


Subband Adaptive Filter Exploiting Sparsity of System

This paper presents a normalized subband adaptive filtering (NSAF) algorithm to cope with the sparsity condition of an underlying system in the context of compressive sensing. By regularizing a weighted l1-norm of the filter taps estimate onto the cost function of the NSAF and utilizing a subgradient analysis, the update recursion of the l1-norm constraint NSAF is derived. Considering two disti...


Iterative Reweighted Noninteger Norm Regularizing SVM for Gene Expression Data Classification

Support vector machine is an effective classification and regression method that uses machine learning theory to maximize the predictive accuracy while avoiding overfitting of data. L2 regularization has been commonly used. If the training dataset contains many noise variables, L1 regularization SVM will provide a better performance. However, both L1 and L2 are not the optimal regularization me...



Journal:
  • IEEE/ACM Trans. Audio, Speech & Language Processing

Volume 24  Issue

Pages  -

Year of publication: 2016